A Comparative Study of Mutual Information Analysis under a Gaussian Assumption

Authors

  • Amir Moradi
  • Nima Mousavi
  • Christof Paar
  • Mahmoud Salmasizadeh
Abstract

At CHES 2008, a generic side-channel distinguisher, Mutual Information Analysis, was introduced; it was designed to be independent of the relation between measurements and leakages as well as of the relation between leakages and the processed data. Assuming a Gaussian model for the side-channel leakage, correlation power analysis (CPA) can reveal the secrets efficiently. The goal of this paper is to compare mutual information analysis (MIA) and CPA when the leakage of the target device fits a Gaussian assumption. We first examine theoretically why MIA can distinguish the correct key guess from the other hypotheses, and then compare this with the corresponding arguments for CPA. As our theoretical comparison confirms, and as was shown recently at ACNS 2009 and CHES 2009, MIA is less effective than CPA when there is a linear relation between leakages and predictions. We then present detailed practical comparison results for MIA and CPA, obtained with several alternative parameters under the same conditions, using the leakage of a smart card as well as that of an FPGA.
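The contrast between the two distinguishers can be made concrete in a few lines of code. Below is a minimal sketch, not taken from the paper, that pits a CPA distinguisher (absolute Pearson correlation) against a histogram-based MIA distinguisher on simulated traces satisfying the Gaussian assumption: leakage = Hamming weight of an S-box output plus Gaussian noise. The toy S-box (a random permutation rather than the AES S-box), trace count, noise level, and bin count are all assumptions chosen for illustration.

```python
# Minimal sketch of CPA vs. MIA under a Gaussian leakage model.
import numpy as np

rng = np.random.default_rng(0)

# Toy 8-bit S-box stand-in (a fixed random permutation, NOT the AES S-box).
SBOX = rng.permutation(256)

def hw(x):
    """Hamming weight of each byte in x."""
    return np.unpackbits(np.asarray(x, dtype=np.uint8)[:, None], axis=1).sum(axis=1)

def simulate_traces(n, key, sigma=1.0):
    """Leakage = HW(Sbox(p XOR key)) + Gaussian noise (the Gaussian assumption)."""
    pt = rng.integers(0, 256, n)
    leak = hw(SBOX[pt ^ key]) + rng.normal(0, sigma, n)
    return pt, leak

def cpa(pt, leak, guess):
    """CPA statistic: |Pearson correlation| between predictions and leakage."""
    h = hw(SBOX[pt ^ guess]).astype(float)
    return abs(np.corrcoef(h, leak)[0, 1])

def mia(pt, leak, guess, bins=16):
    """Histogram-based MIA statistic: I(L;H) = H(L) - H(L|H)."""
    h = hw(SBOX[pt ^ guess])
    edges = np.histogram_bin_edges(leak, bins=bins)
    def entropy(samples):
        p, _ = np.histogram(samples, bins=edges)
        p = p[p > 0] / len(samples)
        return -(p * np.log2(p)).sum()
    h_l = entropy(leak)
    h_l_given_h = sum((h == v).mean() * entropy(leak[h == v])
                      for v in np.unique(h))
    return h_l - h_l_given_h

pt, leak = simulate_traces(5000, key=0x2B)
scores_cpa = [cpa(pt, leak, g) for g in range(256)]
scores_mia = [mia(pt, leak, g) for g in range(256)]
print("CPA best guess:", int(np.argmax(scores_cpa)))  # expect 0x2B = 43
print("MIA best guess:", int(np.argmax(scores_mia)))  # expect 0x2B = 43
```

Both distinguishers rank the correct key first here; the paper's point is about how sharply each separates it from the wrong guesses when the leakage-to-prediction relation is linear.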


Similar Articles

Comparative Analysis of Image Denoising Methods Based on Wavelet Transform and Threshold Functions

Noise is unavoidably introduced during image acquisition and transmission, and should be removed before subsequent processing. Image noise takes many forms, chiefly salt-and-pepper noise and Gaussian noise. This paper focuses on the removal of Gaussian noise. It introduces many wavelet threshold denoising alg...
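As a concrete illustration of the approach sketched above, here is a minimal wavelet soft-thresholding denoiser. It assumes the PyWavelets package, the universal (VisuShrink) threshold, and a MAD noise estimate; it is a generic sketch, not the specific algorithms surveyed in that paper.

```python
# Minimal wavelet soft-threshold denoiser for Gaussian noise (a sketch).
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="db4", level=3):
    """Remove Gaussian noise by soft-thresholding wavelet detail coefficients."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Noise std estimated from the finest diagonal detail band (MAD rule).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(img.size))   # universal threshold
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(band, thresh, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)

# Usage on a synthetic noisy image:
rng = np.random.default_rng(0)
clean = np.zeros((128, 128)); clean[32:96, 32:96] = 1.0
noisy = clean + rng.normal(0, 0.2, clean.shape)
denoised = wavelet_denoise(noisy)
print("noisy MSE:   ", float(((noisy - clean) ** 2).mean()))
print("denoised MSE:", float(((denoised - clean) ** 2).mean()))
```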


Variational Information Maximization in Gaussian Channels

Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in applying information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high-dimensional vector x and its low-dimensional represen...
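The PCA connection mentioned above can be checked numerically. The sketch below illustrates the classical Gaussian-channel fact rather than the paper's variational bound: for y = Wx + eps, I(x; y) = 0.5 * logdet(I + W Sigma W^T / sigma^2), and over orthonormal W this is maximized by the top principal directions of Sigma. The dimensions and noise variance are assumptions.

```python
# Numeric check: PCA directions maximize Gaussian-channel information (sketch).
import numpy as np

rng = np.random.default_rng(0)
d, k, noise_var = 10, 2, 0.5

# Random covariance Sigma with distinct eigenvalues.
A = rng.normal(size=(d, d))
Sigma = A @ A.T

def gaussian_mi(W):
    """I(x; Wx + eps) in nats for x ~ N(0, Sigma), eps ~ N(0, noise_var*I)."""
    M = np.eye(k) + W @ Sigma @ W.T / noise_var
    return 0.5 * np.linalg.slogdet(M)[1]

# PCA projection: top-k eigenvectors of Sigma (as rows of W).
evals, evecs = np.linalg.eigh(Sigma)
W_pca = evecs[:, -k:].T

# A random orthonormal projection for comparison.
Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
W_rand = Q.T

print("MI with PCA directions:   ", gaussian_mi(W_pca))
print("MI with random directions:", gaussian_mi(W_rand))  # lower
```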


Variational Information Maximization and (K)PCA

Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in applying information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high-dimensional vector x and its low-dimensional represen...


The Structure of Least-Favorable Noise in Gaussian Vector Broadcast Channels

The sum capacity of the Gaussian vector broadcast channel is the saddle point of a Gaussian mutual information game in which the transmitter maximizes the mutual information by choosing the best transmit covariance matrix subject to a power constraint, and the receiver minimizes the mutual information by choosing a least-favorable noise covariance matrix subject to a diagonal constraint. This r...
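The max-min structure described above can be illustrated with a toy two-user example. The grid search below is an illustration rather than the paper's method: it evaluates the Gaussian mutual information 0.5*[logdet(H Sx H^T + Sz) - logdet(Sz)] with H = I, a power constraint on the transmit covariance Sx, and a noise covariance Sz whose diagonal is fixed at 1 while its off-diagonal correlation is chosen adversarially. The dimensions, power budget, and grids are assumptions.

```python
# Toy max-min mutual information game for a 2x2 Gaussian channel (sketch).
import numpy as np

P = 2.0  # total transmit power budget

def mi(Sx, rho):
    """0.5*[logdet(Sx + Sz) - logdet(Sz)] with H = I and unit noise diagonal."""
    Sz = np.array([[1.0, rho], [rho, 1.0]])
    _, num = np.linalg.slogdet(Sx + Sz)
    _, den = np.linalg.slogdet(Sz)
    return 0.5 * (num - den)

# Transmitter searches diagonal Sx = diag(p, P - p); receiver searches rho.
p_grid = np.linspace(0.0, P, 81)
rho_grid = np.linspace(-0.99, 0.99, 81)

best = max(
    (min(mi(np.diag([p, P - p]), r) for r in rho_grid), p)
    for p in p_grid
)
print("max-min MI %.4f nats at p = %.3f" % best)  # symmetric split expected
```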


An Information-Theoretic Derivation of the MMSE Decision-Feedback Equalizer

We consider the discrete-time Gaussian channel with inter-symbol interference (ISI). Under the assumption of perfect feedback, an information-theoretic derivation of the minimum mean-squared error (MMSE) decision-feedback equalizer (DFE) is presented. Whereas previous works have used information theory to analyze the zero-forcing and MMSE DFE structures, this work derives the MMSE DFE directly ...
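The perfect-feedback assumption mentioned above makes a finite-length MMSE DFE easy to sketch. The code below is an illustrative construction, not the paper's information-theoretic derivation: past-symbol ISI is cancelled exactly by a feedback filter fed with true symbols, so the feedforward filter only trades residual future ISI against noise. The channel taps, filter length, decision delay, and noise variance are assumptions.

```python
# Finite-length MMSE DFE on a Gaussian ISI channel, perfect feedback (sketch).
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 0.6, 0.3])      # FIR ISI channel taps
L, N, sigma2 = len(h), 8, 0.05     # channel/feedforward lengths, noise var

# Window model: [r[t], ..., r[t+N-1]] = H @ [s[t-L+1], ..., s[t+N-1]] + noise.
H = np.zeros((N, N + L - 1))
for i in range(N):
    H[i, i:i + L] = h[::-1]

D = L - 1                          # target symbol s[t]; columns < D are past
Hf = H[:, D:]                      # target + future symbols (not yet decided)
e0 = np.zeros(Hf.shape[1]); e0[0] = 1.0

# Perfect feedback cancels past-symbol ISI exactly, so the MMSE feedforward
# filter only balances residual future ISI against noise:
w = np.linalg.solve(Hf @ Hf.T + sigma2 * np.eye(N), Hf @ e0)
b = H[:, :D].T @ w                 # feedback taps on the D past symbols

# Run it: BPSK through the channel; true past symbols = perfect feedback.
s = rng.choice([-1.0, 1.0], 4000)
r = np.convolve(s, h)[:len(s)] + rng.normal(0, np.sqrt(sigma2), len(s))
errs = 0
for t in range(D, len(s) - N):
    ff = w @ r[t:t + N]            # feedforward output
    fb = b @ s[t - D:t]            # feedback from past symbols s[t-D..t-1]
    errs += (np.sign(ff - fb) != s[t])
print("symbol error rate:", errs / (len(s) - N - D))
```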



Journal title:

Volume   Issue

Pages   -

Publication date: 2009